Second International Workshop on Rewriting Techniques for Program Transformations and Evaluation (WPTE'15)
Authors
Abstract
Recently, a standardization theorem has been proven for a variant of Plotkin's call-by-value lambda-calculus extended by means of two commutation rules (sigma-reductions); this result was based on a partitioning between head and internal reductions. We study head normalization for this call-by-value calculus with sigma-reductions and relate it to weak evaluation in Plotkin's original call-by-value lambda-calculus. We also give a (non-deterministic) normalization strategy for the call-by-value lambda-calculus with sigma-reductions.

Structural simplification of chemical reaction networks preserving deterministic semantics
Authors: Guillaume Madelaine, Cédric Lhoussaine, and Joachim Niehren
Abstract: We study the structural simplification of chemical reaction networks preserving the deterministic kinetics. We aim at finding simplification rules that can eliminate intermediate molecules while preserving the dynamics of all others. The rules should remain valid even when the network is plugged into a bigger context. An example is Michaelis-Menten's simplification rule for enzymatic reactions. In this paper, we present structural simplification rules for reaction networks that can eliminate intermediate molecules at equilibrium, without assuming that all molecules are at equilibrium, i.e. in a steady state. Our simplification rules preserve the deterministic semantics of reaction networks in all contexts compatible with the equilibrium of the eliminated molecules. We illustrate the simplification on a biological example network from systems biology.
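The Michaelis-Menten rule mentioned in the abstract is the textbook instance of such a simplification: the intermediate enzyme-substrate complex ES in E + S ⇌ ES → E + P is eliminated at equilibrium, leaving the single rate law v = Vmax·[S]/(Km + [S]). The following numerical sketch is our own illustration, not taken from the paper; all rate constants, concentrations, and the Euler integration scheme are arbitrary choices:

```python
# Michaelis-Menten reduction: eliminate the intermediate complex ES
# from  E + S <-> ES -> E + P  (illustrative rate constants).
k1, k_1, k2 = 1.0, 1.0, 0.1      # binding, unbinding, catalysis
e_total = 1.0                     # total enzyme concentration

def reduced_rate(s):
    """Michaelis-Menten rate law after eliminating ES at equilibrium."""
    vmax = k2 * e_total
    km = (k_1 + k2) / k1
    return vmax * s / (km + s)

def simulate_full(s0, t_end, dt=1e-3):
    """Euler integration of the full mass-action ODEs; returns P(t_end)."""
    s, es, p, t = s0, 0.0, 0.0, 0.0
    while t < t_end:
        e = e_total - es                    # free enzyme
        v_bind = k1 * e * s - k_1 * es      # net complex formation
        v_cat = k2 * es                     # catalysis
        s += dt * (-v_bind)
        es += dt * (v_bind - v_cat)
        p += dt * v_cat
        t += dt
    return p

def simulate_reduced(s0, t_end, dt=1e-3):
    """Euler integration of the one-reaction reduced model."""
    s, p, t = s0, 0.0, 0.0
    while t < t_end:
        v = reduced_rate(s)
        s += dt * (-v)
        p += dt * v
        t += dt
    return p
```

When binding and unbinding are fast relative to catalysis, the two simulations agree closely; the point of the paper's rules is to justify such eliminations structurally and compositionally, in any context compatible with the equilibrium of the eliminated molecule.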
A simple extension of the Curry-Howard correspondence with intuitionistic lambda rho calculus
Author: Naosuke Matsuda
Abstract: In (Fujita et al., to appear), a natural deduction style proof system called "intuitionistic λρ-calculus" for implicational intuitionistic logic was given, together with some reduction rules for the proof system. In this paper, we show that the system is easy to treat but has sufficient expressive power to provide a powerful model of computation.

2nd International Workshop on Rewriting Techniques for Program Transformations and Evaluation (WPTE'15). Editors: Yuki Chiba, Santiago Escobar, Naoki Nishida, David Sabel, and Manfred Schmidt-Schauß. OpenAccess Series in Informatics, Schloss Dagstuhl – Leibniz-Zentrum für Informatik, Dagstuhl Publishing, Germany.

The Collection of all Abstracts of the Talks at WPTE 2015

Towards Modelling Actor-Based Concurrency in Term Rewriting
Authors: Adrián Palacios and Germán Vidal
Abstract: In this work, we introduce a scheme for modelling actor systems within sequential term rewriting.
In our proposal, a TRS consists of the union of three components: the functional part (which is specific to a system), a set of rules for reducing concurrent actions, and a set of rules defining a particular scheduling policy. A key ingredient of our approach is that concurrent systems are modelled by terms in which concurrent actions can never occur inside user-defined function calls. This assumption greatly simplifies the definition of the semantics for concurrent actions, since no term traversal is needed. We prove that these systems are well defined in the sense that concurrent actions can always be reduced. Our approach can be used as a basis for modelling actor-based concurrent programs, which can then be analyzed using existing techniques for term rewrite systems.

Mechanizing Meta-Theory in Beluga
Author: Brigitte Pientka
Abstract: Mechanizing formal systems, given via axioms and inference rules, together with proofs about them, plays an important role in establishing trust in formal developments. In this talk, I will survey the proof environment Beluga.
To specify formal systems and represent derivations within them, Beluga provides a sophisticated infrastructure based on the logical framework LF; in particular, its infrastructure not only supports modelling binders via binders in LF, but extends and generalizes LF with first-class contexts to abstract over a set of assumptions, contextual objects to model derivations that depend on assumptions, and first-class simultaneous substitutions to relate contexts. These extensions allow us to directly support key and common concepts that frequently arise when describing formal systems and derivations within them. To reason about formal systems, Beluga provides a dependently typed functional language for implementing inductive proofs about derivations as recursive functions on contextual objects, following the Curry-Howard isomorphism. Recently, the Beluga system has also been extended with a totality checker, which guarantees that recursive programs are well-founded and correspond to inductive proofs, and an interactive program development environment to support incremental proof/program construction. Taken together, these extensions enable direct and compact mechanizations. To demonstrate Beluga's strength, we develop a weak normalization proof using logical relations. The Beluga system together with examples is available from http://complogic.cs.mcgill.ca/beluga/.

Observing Success in the Pi-Calculus
Authors: David Sabel and Manfred Schmidt-Schauß
Abstract: A contextual semantics, defined in terms of successful termination and may- and should-convergence, is analyzed in the synchronous pi-calculus with replication and a constant Stop to denote success. The contextual ordering is analyzed, some nontrivial process equivalences are proved, and proof tools for showing contextual equivalences are provided.
Among them are a context lemma and new notions of sound applicative similarities for may- and should-convergence. A further result is that contextual equivalence in the pi-calculus with Stop conservatively extends barbed testing equivalence in the (Stop-free) pi-calculus, and thus results on contextual equivalence can be transferred to the (Stop-free) pi-calculus with barbed testing equivalence.

Context-Moving Transformation for Term Rewriting Systems
Authors: Koichi Sato, Kentaro Kikuchi, Takahito Aoto, and Yoshihito Toyama
Abstract: Proofs by induction are often incompatible with tail-recursive definitions, as the accumulator changes in the course of unfolding the definitions. Context-moving (Giesl, 2000) for functional programs transforms tail-recursive programs into non-tail-recursive ones, which are more suitable for verification. In this work, we formulate a context-moving transformation for term rewriting systems, and prove its correctness with respect to both eager evaluation semantics and initial algebra semantics, under some conditions on the programs to be transformed.
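The kind of transformation discussed above can be illustrated on the standard accumulator example (the concrete program is ours, not from the paper): a tail-recursive summation, where each unfolding changes the accumulator argument, versus the context-moved version in which the surrounding context "x + _" is applied outside the recursive call, so the defining equations match the shape of structural induction.

```python
# Tail-recursive definition: the accumulator changes at every unfolding,
# which makes direct proofs by structural induction awkward.
def sum_acc(xs, acc=0):
    if not xs:
        return acc
    return sum_acc(xs[1:], acc + xs[0])

# Context-moved definition: the context "x + _" sits outside the
# recursive call, so induction on the list structure applies directly.
def sum_cm(xs):
    if not xs:
        return 0
    return xs[0] + sum_cm(xs[1:])

# The transformation is correct when both define the same function:
assert sum_acc([1, 2, 3]) == sum_cm([1, 2, 3]) == 6
```

Correctness of the transformation amounts to showing the two definitions agree on all inputs, which is exactly what the paper establishes at the level of term rewriting systems, for both eager evaluation and initial algebra semantics.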
Formalizing Bialgebraic Semantics in PVS 6.0
Authors: Sjaak Smetsers, Ken Madlener, and Marko van Eekelen
Abstract: Both operational and denotational semantics are prominent approaches for reasoning about properties of programs and programming languages. In the categorical framework developed by Turi and Plotkin, both styles of semantics are unified using a single, syntax-independent format, known as GSOS, in which the operational rules of a language are specified. From this format, the operational and denotational semantics are derived. The approach of Turi and Plotkin is based on the categorical notion of bialgebras. In this paper we formalize this work in the theorem prover PVS, and prove the adequacy theorem of this formalization. One of our goals is to investigate whether PVS is adequately suited for formalizing metatheory. Indeed, our experiments show that the original categorical framework can be formalized conveniently. Additionally, we present a GSOS specification for the simple imperative programming language While, and execute the derived semantics for a small example program.
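To give a feel for the last point, here is an interpreter for a core While language of the kind referred to in the abstract. This is our own illustrative sketch in Python; the paper's actual specification is a categorical GSOS format mechanized in PVS, not a direct interpreter.

```python
# Big-step interpreter for a core While language; programs are nested
# tuples, states are dicts mapping variable names to integers.
# Statements: ("skip",) | ("assign", x, aexp) | ("seq", s1, s2)
#           | ("if", bexp, s1, s2) | ("while", bexp, s)

def aeval(a, st):
    """Arithmetic expressions: int literal, variable name, or (op, a1, a2)."""
    if isinstance(a, int):
        return a
    if isinstance(a, str):
        return st[a]
    op, a1, a2 = a
    x, y = aeval(a1, st), aeval(a2, st)
    return {"+": x + y, "-": x - y, "*": x * y}[op]

def beval(b, st):
    """Boolean expressions: ("not", b), ("<=", a1, a2), or ("=", a1, a2)."""
    if b[0] == "not":
        return not beval(b[1], st)
    op, a1, a2 = b
    return {"<=": aeval(a1, st) <= aeval(a2, st),
            "=": aeval(a1, st) == aeval(a2, st)}[op]

def run(stmt, st):
    """Execute a statement, returning the final state."""
    tag = stmt[0]
    if tag == "skip":
        return st
    if tag == "assign":
        _, x, a = stmt
        return {**st, x: aeval(a, st)}
    if tag == "seq":
        return run(stmt[2], run(stmt[1], st))
    if tag == "if":
        _, b, s1, s2 = stmt
        return run(s1 if beval(b, st) else s2, st)
    if tag == "while":
        _, b, s = stmt
        while beval(b, st):
            st = run(s, st)
        return st
    raise ValueError(f"unknown statement {tag!r}")

# Example: factorial of 5 via a while loop.
fact = ("seq", ("assign", "acc", 1),
        ("while", ("<=", 1, "n"),
         ("seq", ("assign", "acc", ("*", "acc", "n")),
                 ("assign", "n", ("-", "n", 1)))))
final = run(fact, {"n": 5})
# final["acc"] == 120
```

In the bialgebraic setting, the interest is that such a semantics need not be written by hand: once the operational rules are given in GSOS format, both the transition system and a compositional denotational semantics are derived, and adequacy relates the two.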
Similar resources
Contextual Equivalences in Call-by-Need and Call-By-Name Polymorphically Typed Calculi (Preliminary Report)
This paper presents a call-by-need polymorphically typed lambda-calculus with letrec, case, constructors and seq. The typing of the calculus is modelled in a system-F style. Contextual equivalence is used as semantics of expressions. We also define a call-by-name variant without letrec. We adapt several tools and criteria for recognizing correct program transformations to polymorphic typing, in...
First International Workshop on Rewriting Techniques for Program Transformations and Evaluation, WPTE 2014, July 13, 2014, Vienna, Austria
Transforming conditional term rewrite systems (CTRSs) into unconditional systems (TRSs) is a common approach to analyzing properties of CTRSs via the simpler framework of unconditional rewriting. In the past, many different transformations have been introduced for this purpose. One class of transformations, so-called unravelings, has been analyzed extensively. In this paper we provide...
متن کاملConfluence of Conditional Term Rewrite Systems via Transformations
Conditional term rewriting is an intuitive yet complex extension of term rewriting. In order to benefit from the simpler framework of unconditional rewriting, transformations have been defined to eliminate the conditions of conditional term rewrite systems. Recent results provide confluence criteria for conditional term rewrite systems via transformations, yet they are restricted to CTRSs with ...
Notes on Structure-Preserving Transformations of Conditional Term Rewrite Systems
Transforming conditional term rewrite systems (CTRSs) into unconditional systems (TRSs) is a common approach to analyzing properties of CTRSs via the simpler framework of unconditional rewriting. In the past, many different transformations have been introduced for this purpose. One class of transformations, so-called unravelings, has been analyzed extensively. In this paper we provide...
HOR 2012 6th International Workshop on Higher-Order Rewriting
The aim of HOR is to provide an informal forum to discuss all aspects of higher-order rewriting. The topics of the workshop include applications, foundations, frameworks, implementations, and semantics. HOR is a biannual meeting. The present volume provides final versions of six accepted contributed extended abstracts of talks selected for the workshop. HOR 2012 also had a tool session. The fo...
Transformation techniques for context-sensitive rewrite systems
Context-sensitive rewriting is a computational restriction of term rewriting used to model non-strict (lazy) evaluation in functional programming. The goal of this paper is the study and development of techniques to analyze the termination behavior of context-sensitive rewrite systems. For that purpose, several methods have been proposed in the literature which transform context-sensitive rewrit...
Publication date: 2015